Intra-class Adaptive Augmentation with Neighbor Correction for Deep Metric Learning
Authors
Abstract
Deep metric learning aims to learn an embedding space in which semantically similar samples are close together and dissimilar ones are pushed apart. To explore more hard and informative training signals for augmentation and generalization, recent methods focus on generating synthetic samples to boost metric learning losses. However, these methods use only deterministic, class-independent generations (e.g., simple linear interpolation), which can cover only a limited part of the distribution space around the original samples. They have overlooked the wide characteristic changes of different classes and cannot model abundant intra-class variations for generation. Therefore, the generated samples not only lack rich semantics within a certain class, but also might be noisy signals that disturb training. In this paper, we propose a novel intra-class adaptive augmentation (IAA) framework for deep metric learning. We reasonably estimate intra-class variations for every class and generate adaptive synthetic samples to support hard sample mining and boost metric learning losses. Further, for most datasets that have only a few samples per class, we propose a neighbor correction to revise inaccurate estimations, according to our discovery that similar classes generally have similar variation distributions. Extensive experiments on five benchmarks show that our method significantly improves and outperforms state-of-the-art retrieval performance by 3%-6%. Our code is available at https://github.com/darkpromise98/IAA .
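The abstract's pipeline (estimate per-class variation, correct it using similar classes, then sample synthetic embeddings) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the estimator (per-class mean and diagonal variance), the neighbor-similarity measure (distance between class means), and all function names here are assumptions for exposition.

```python
import numpy as np

def estimate_class_stats(embeddings, labels):
    """Estimate a per-class mean and diagonal variance of the embeddings.

    Hypothetical estimator; the paper's exact procedure may differ.
    """
    stats = {}
    for c in np.unique(labels):
        e = embeddings[labels == c]
        stats[int(c)] = (e.mean(axis=0), e.var(axis=0))
    return stats

def neighbor_correction(stats, k=2):
    """Revise each class's variance by averaging it with the variances of
    its k nearest classes (by class-mean distance), reflecting the paper's
    observation that similar classes have similar variation distributions.
    """
    classes = sorted(stats)
    means = np.stack([stats[c][0] for c in classes])
    corrected = {}
    for i, c in enumerate(classes):
        dists = np.linalg.norm(means - means[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]  # skip the class itself
        vars_to_avg = [stats[classes[j]][1] for j in nbrs] + [stats[c][1]]
        corrected[c] = (stats[c][0], np.mean(vars_to_avg, axis=0))
    return corrected

def generate_synthetic(anchor, stats, label, rng):
    """Sample one synthetic embedding around an anchor, scaled by the
    (corrected) intra-class variance of the anchor's class."""
    _, var = stats[label]
    return anchor + rng.normal(0.0, np.sqrt(var))
```

The synthetic embeddings produced this way would then be fed, together with the real samples, into an ordinary metric learning loss; how they are mixed into mining and loss computation is left out of this sketch.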
Similar Resources
BoostML: An Adaptive Metric Learning for Nearest Neighbor Classification
The nearest neighbor classification/regression technique, besides its simplicity, is one of the most widely applied and well studied techniques for pattern recognition in machine learning. A nearest neighbor classifier assumes class conditional probabilities to be locally smooth. This assumption is often invalid in high dimensions and significant bias can be introduced when using the nearest ne...
Adaptive Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chisqu...
Locally Adaptive Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-square...
Adaptive Kernel Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse-ofdimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...
Locally Adaptive Metric Nearest-Neighbor Classification
Nearest-neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest-neighbor rule. We propose a locally adaptive nearest-neighbor classification method to try to minimize bias. We use a Chi-s...
Journal
Journal title: IEEE Transactions on Multimedia
Year: 2022
ISSN: 1520-9210, 1941-0077
DOI: https://doi.org/10.1109/tmm.2022.3227414